Do you use quizzes or written assignments in your teaching? If so, you may be thinking about the impact of ChatGPT this session and wondering what others across campus are doing.
You're in the right place. We've done the legwork for you and asked MQ teachers about their approaches to making quizzes and written tasks more robust in the short term, while they consider if and how they might change assessment tasks in the future.
Note – an update, Nov 2023: Strategies such as using very current sources (e.g. from within the past three months) can reduce, but not eliminate, the ability of generative AI tools to respond. There are now multiple generative AI tools that can include web search results in their responses. Similarly, AI tools can now interpret images and diagrams that may be part of a question. However, including these elements still adds richness to the task.
Making quizzes more robust this session
Morwenna Kirwan (Director of Education in the Department of Health Sciences)
In some of the units in my course, the Curriculum Management system does not specify whether a mid-session quiz is invigilated, or open- or closed-book. This gives us the flexibility to run mid-session quizzes in scheduled tutorial time, using a hybrid (paper and iLearn quiz) method to get the best of both worlds.
The best of both worlds
We plan to provide students with a paper-based exam and have them enter their responses in a skeleton iLearn quiz: instead of the question prompts, students will see only “Question 1 – a/b/c/d”, “Question 2 – a/b/c/d”, and so on. This set-up means that students can’t screen-capture the questions, and because the quiz is invigilated and we are watching screens, students cannot easily copy-paste the prompts into ChatGPT.
At the same time, we’ll retain the key advantages of the iLearn quiz (automatic marking for multiple-choice questions and the convenience of typing as opposed to handwriting, which is important for both students and markers as reading handwritten text can be more challenging).
We’ll ask students to bring a fully charged device as well as a power cord. We’ve ensured our learning space has sufficient powerpoints and will spread the desks accordingly. We will also be collecting quiz papers afterwards – so we can roll this out across many tutorial groups over the week. This approach was something we were planning on doing even before ChatGPT was released. It’ll provide us with better assurance of learning and will still retain the convenience of avoiding hand-written work.
Interested in doing something similar?
Step 1: Communicate important dates to students early (e.g., “you’ll have a mid-session quiz in week 6”) to make sure that they can attend.
Step 2: Create two versions of the quiz: (1) a fully worded paper version that students will be given on the day, and (2) a ‘blank’ iLearn quiz with no question/answer wording that students will use to select their answers. Make sure that the paper and iLearn versions have a consistent structure, e.g., both use either 1/2/3/4 or A/B/C/D as answer options.
Step 3: Remind students to bring a fully charged device and a power cord just in case.
Step 4: Have at least 1–2 spare devices just in case, and offer students the option to complete the mid-session quiz on paper should their device fail.
Re-word prompts for higher order skills
Nicholas Tse (Senior Lecturer, School of Engineering)
I tested my initial quizzes in ChatGPT and it got most of them right. However, when I changed the wording to “Which of the 2 are most probable?” or “Pick 4 out of 6”, ChatGPT didn’t do so well. This is probably because, being a text generator, ChatGPT lacks the analytical skills required to respond to questions involving higher-order thinking, such as picking the better option.
Interested in doing something similar?
Step 1: Test-drive your quiz questions in ChatGPT
Step 2: Try alternative wording that goes beyond a knowledge check and includes some analysis, e.g., “Which is less likely…” or “Which is more suitable for [provide context]…”.
Fresh data, visuals and context are key
Essays (or similar) are another common yet vulnerable assessment type. Colleagues around campus shared their ideas/tips for making written tasks less vulnerable this session.
Morwenna Kirwan (Director of Education in the Department of Health Sciences)
Apparently, ChatGPT’s knowledge is cut off at 2021, so for this session I plan to provide students with some 2022 readings and ask that students reference them in their essays.
It doesn’t mean that students won’t use ChatGPT, but it does mean that they’ll need to engage with 2022 readings in order to incorporate them in their essays. Longer term, however, we’ll be stepping away from essays and looking at other assessment methods, which is something I have wanted to do anyway.
Narelle Hess (Lecturer, Psychology)
For a case study assessment task, we are providing students with recent information sources, such as readings, company reports, and research data. These information sources are from 2022–2023 and are therefore outside of ChatGPT’s training data. In this way, students are being asked to use their critical thinking skills to connect these different information sources and combine them with their own research to inform their recommendations for this organisational case study. This unit is also inviting students to engage with AI in one of the tutorial classes, to help students better understand how AI may help employees at work, and where its limitations lie.
Nicholas Tse (Senior Lecturer, School of Engineering)
When I tested ChatGPT, it did fairly well on bigger and more abstract topics but didn’t do so well on recent and local examples. So this session I plan to modify the assessments to have students include recent and localized examples, rather than allowing them to keep the interpretation open and vague. In the future, we might re-think how we approach assurance of learning and focus on higher-order skills rather than knowledge checks.
Visuals in tasks
A/Prof Wylie Bradford (Economics)
I often provide students with visual prompts, such as a model or a diagram, and ask them to apply these to recent events. ChatGPT can’t currently deal with visuals well, and students will probably find that whatever they get out of the tool is not very good. ChatGPT has also been trained on pre-2021 data, which is yet another reason to use something that we know motivates students – extreme currency.
Interested in doing something similar?
Apart from bringing recent references into assessment requirements, you can contextualize your assessment by using recent events or a local example. For example, you can provide 2022-2023 news (e.g., NSW trains industrial action, the war in Ukraine, 2022 ruling of the Supreme Court on abortion rights, the passing of Queen Elizabeth, etc.), and ask students to analyze it using the frameworks from your subject.
You can also ask them to identify a local example (e.g., a community event, organization, etc.) and use it as a case study for analysis. It’s likely that ChatGPT will not have sufficient data on a small local event/community to produce a decent response.
And, if possible, consider using visuals in your assessment prompts.
On-going check-ins provide support and accountability
Nicholas Tse (Senior Lecturer, School of Engineering)
In one of my other units, MECH4001, we have students submit a paper-based logbook to document progress and consistent work output. It also encourages reflection as students review their entries. One of the nice ‘side-effects’ of using a logbook is keeping students accountable for their work and preventing them from ‘generating the product’ at the last minute.
Nandini Krishna Kumar (Lecturer, Accounting)
While we can’t change assessment tasks this session, we can do regular check-ins with students during tutorials. For example, I’ll allocate some time in each tutorial/seminar to a ‘work-in-progress’ discussion. Not only will this allow me to provide guidance and support on the assignments students are currently working on, but it will also show if a student is falling behind and not progressing with their work.
There are many benefits to having such discussions. For assignments stretching over multiple weeks, they help to ensure that the work is done progressively and make it easier to map the progress of the task. The time can also be used as a conversation starter with peers, providing visibility and accountability that can reduce the risk of academic dishonesty.
A/Prof Wylie Bradford (Economics)
One approach I’ve used for a long time is to ask students to write classroom reflections. These are much harder to produce if you have not attended the class discussion. I plan to continue with this approach, as it works well for ChatGPT-proofing assessments too.
Using ChatGPT in teaching
A/Prof Wylie Bradford (Economics)
I would be cautious about talking to students about ChatGPT and trying to dissuade them from using it, as it might backfire and create a Streisand effect.
Nicholas Tse (Senior Lecturer, School of Engineering)
I am going to assume that students are going to use ChatGPT and work with that; in fact, I think it’s important that we do work with ChatGPT. Moving forward, I’m pretty sure that managers in the workplace will have increased expectations for work output, so why not start while students are still at university?
I was thinking of generating a response to an assignment task in AI and getting students to analyse and comment on it in class.
Also, ChatGPT can be a learning tool. It’s good at providing summaries, and students may be able to process more readings and get explanations from ChatGPT until a concept is clear to them. However, it is important to help students develop AI literacy so that they can be ‘savvy’ AI consumers, able to tell accurate from inaccurate information – something AI tools currently get wrong quite frequently.
Agi Bodis (Lecturer, Linguistics)
Preparing for future careers:
I am keen to address how ChatGPT will be useful for my students in their professional careers as future language teachers, particularly in the areas of lesson planning, creating teaching resources, and assessment tasks to be used in class. With my students, I’m planning to look at how to generate texts aimed at a certain language proficiency level for use as teaching resources, and how to change the genre of a text; I’m also keen to see what additional ideas my students may have for making the work of a language teacher more efficient. The assessment tasks mirror the kind of activities a language teacher does, so if ChatGPT has a place in that work, it’s unavoidable here too; however, just as ChatGPT won’t replace a language teacher, students won’t be able to pass these assignments by using AI alone.
Jacqueline Mackaway (Lecturer, Sociology)
I’m starting from a place where I’m not going to assume my students are all digital natives and will be early adopters of ChatGPT – nor am I going to assume that they’re planning to use it to cheat. This doesn’t mean I’m going to ignore this tool or pretend it’s not going to play some role in students’ lives now and into the future.
I teach a PACE unit, so with this in mind I will be engaging with both students and industry partners to understand the affordances, limitations, risks and ethics of using the tool in both the classroom and workplace.
Some very early steps taken so far this semester include asking (via survey) my industry-based workplace supervisors about their use (or non-use) of the tool, and its potential benefits, risks, and challenges. Their feedback is still coming in and will be incorporated into a class discussion with students in week 4. Part of that discussion will centre on the ethics of such tools and when it is and isn’t okay to use ChatGPT as part of their learning and work.
Between now and semester 2, assessment tasks will also be reviewed to consider if/how they may need revising. Part of this review will involve incorporating student and workplace supervisor perspectives on ChatGPT and its role in supporting learning and work. It could be that both parties end up involved in the co-design of assessment tasks – something that aligns with my own philosophy of student-centred learning. I’m still investigating the tool and how to ‘work with it’, as I don’t want to rush into making unnecessary or poorly conceived changes.
Join the conversation!
Join us on Thursday, 30th March 2023 from 13:00-14:00 for the MQ Community Roundtable: Gen AI tools – implications for learning, teaching and assessment at Macquarie.
Can’t attend yourself? Share this post with others!
Header image Designed by Freepik (original modified for this article)